#searching for a vital doc with literally -16 memory on where it could be
fuzziemutt · 8 months ago
Text
I hate being so paranoid and hiding all my shit sometimes because like
"teehee no one will ever find this important doc now :D"
My brother in Burger the no one includes the you!
3 notes
docfuture · 5 years ago
Text
Princess, part 1
     [This story is a prequel, set several years before The Fall of Doc Future, when Flicker is 16.  There are a few minor changes in continuity from the serialized versions of Fall and Skybreaker's Call, for reasons explained here (spoilers if you haven't read Fall and Call).  Link to the latest chapter of The Maker’s Ark is here, and links to some of my other work are here.  Planning to update this at least once a week until it's done--next update is set to be up by October 19th.]
When Flicker had formally started her career as a superhero, Doc Future had expressed reservations about her ability in two areas.  Not her power, technique, knowledge, or motivation, but two mundane, almost trivial seeming skills.  Flicker hadn't thought they would cause any major problems.
They did.
The skills were triage and pacing.  What to prioritize when she couldn't get to everything, when to stop, and when not to even go on call for the day because she hadn't recovered enough yet.
For someone who could respond to any emergency, anywhere in the world, they were essential.  She'd eventually dealt with the triage issue by setting general priorities and letting Doc's Database handle the rest--because that let her concentrate on pacing.  Which was still a problem.  One that was manageable, but not always well managed.
Because knowing that people somewhere were dying or getting hurt because she wasn't willing or able to push just a little further... bothered her.  A lot.
The problem wasn't a lack of time--she couldn't literally make time, but she could sure stretch it.  She could think at up to a million times human subjective speed, and move even faster.  But she eventually needed to slow down again to stay sane.  So her bottlenecks were the things she couldn't speed through, like long-term memory organization and biological recovery.  She still pushed--she was fast at them by human standards.  By her own, she was glacially slow.  And biological recovery--downtime while body and brain chemistry caught up with everything she'd done at speed--often felt like shirking responsibility, even when she desperately needed it.
She'd finished her active patrol shift for the day after hitting her stop threshold.  She'd saved lives.  That helped a little.  The estimated number of lives she saved had three digits.  That helped a little, too.  Neither of those helped as much as it hurt that the number was smaller than yesterday.  But she'd learned; keep pushing too long and her judgement and health suffered.
There were other things she needed to learn, too.  Continuing her education was one of the few things that let her recover at human speed without feeling guilty, and she had access to Doc's Database, which was, among other things, the best reference library in the world.  So Flicker had picked up a wide and detailed, if idiosyncratic, base of knowledge.  Learning and recovery were what she was doing now.  While on disaster watch, because her sense of duty wouldn't let her ignore the world's problems completely.  That was an occupational hazard for superheroes.
She was reading and tracing references on some subjects she found interesting, though not particularly pleasant.  Which felt appropriate, because her new Database threshold algorithm wasn't very pleasant either.  But it let her help with the worst events even when she was on the ragged edge of burnout.  She needed the structure, otherwise she wouldn't reliably stop; it was too easy to keep going down the global incident priority list from the Database doing 'just one more thing'.  There were always more things--her speed and the Database's realtime intel gathering and analysis meant she could intervene in a significant chunk of all accidents, fires, industrial mishaps, and assorted violence in the world.  And wipe herself out in ten minutes.  So limits were necessary.
As was what she was learning.
*****
Two days ago...
"Doc, I need advice on my next research topic," said Flicker.  "It's practical, not academic.  And I got several Database warnings and an advisory about where I was going to start."       Doc Future put down the scanner and pushed his goggles up on his forehead.  They were in his lab.  It was cluttered, as it usually was when he was working.  They had their differences, but Doc always made time for her sudden interruptions, and his suggestions were often helpful.  And he rarely tried to push any topic that wasn't vital.  Flicker didn't know if that was what normal humans needed in a parent, but it was what she needed.       "An advisory?  What topic?" he asked.       "Politics," said Flicker.  "I've screwed up too much because I don't understand it, and I haven't learned better because I dislike it.  But I need to."       "Reasonable," said Doc.       "The advisory was to talk to you if you had time.  I did a Database search for a good book to start learning the foundations of applied political science from, and got a list of suggestions.  And I did another search for useful guides for someone who has power but is still naive and doesn't want to get people killed out of ignorance.  And I got another list."       Flicker paused.  "One book was on both lists.  And the first warning was about the importance of context, proper translation, and annotations."       "One moment," said Doc.  Then, to the air:  "Database, command, pizza and drinks, break room."       "Acknowledged," came from the nearest lab speaker.       "Um," said Flicker.  "You don't need to drop everything--"       "Priorities," said Doc.  He shut down the scanner and a data recorder, then headed for the door.       Flicker frowned.  "How did you know I skipped lunch?"       Doc smiled crookedly.  "I didn't.  But there will be enough for both of us."
     "How is your pacing management going?" asked Doc, after his first slice of pizza.       "It's okay when I spend at least some time working with Jetgirl or Journeyman.  But she hit a long run of stuff I couldn't help with, then went on vacation, and Journeyman has been away dealing with that interdimensional magician thing again."       Flicker looked away.  "I had to do something about that fire in Lagos.  There wasn't anyone else; Virago was in Sudan, the Saharan was in Mali, and the fire department was stuck in traffic.  I trashed the building, but it all would have burned anyway, and I got the people out, so..."       She stopped and took a breath before starting again.  "Yeah.  Pacing is still a problem.  I set the off-duty interrupt thresholds even higher.  I hate doing it, though.  It feels like turning my back on the world--I get this angry helpless feeling.  Like the last time I tried to study how politics works.  But if I'm feeling that way anyway, I might as well deal with something important that I've been avoiding."       "I've been known to do that," said Doc.  "Jumping Spider calls it cage match coping.  Not ideal, but..."       "But we don't live in an ideal world," she finished.  Doc's standard response to her unanswerable 'but why' questions when she was younger.  "So, what's special about this book?  Other than the fact that the author's name has become a synonym for scheming?"       "Well," said Doc, "you were going to run into it eventually, because I encouraged you to consult historical sources if they're useful.  The Prince is still germane in ways Newton's Principia is not, it's far shorter and more readable, and it was written colloquially, in the style of a guide.  Machiavelli had personal, directly relevant experience--and a dark sense of humor.  But it's been five hundred years.  You do need that context.  Also, later reactions to it by others, and their distortions of it, can be illuminating."       "You seem to know a lot about it," said Flicker.       "Yes."  Doc waited for Flicker to finish a slice of pizza before continuing.  "Given your current stressors, I'd like to test something, if you're up for it."       "Possibly..."       "You've mentioned occasionally your frustration with my reluctance to conduct further experiments involving your high speed mind."       "Yeah.  I get the dangerous part, but ignorance is dangerous, too."       "There are always tradeoffs," said Doc.  "So I'd like you to speed up, skim The Prince--skip the background and any translator editorializing for now--summarize your impressions, then slow back down and compare them with your normal speed reaction."       "I won't retain much more than the summary unless I reread it at normal speed."       "You were going to do that anyway.  And it's short."       "Okay."       Flicker sped up, loaded the book into her visor, linked it to the virtual keyboard interface in the gloves of her costume, and started reading. "Yuck," she said after finishing and slowing down.  "Gratuitous misogyny right near the end.  That was obnoxious."       "Yes.  Welcome to a primary source from the Renaissance," said Doc.  "Any reaction discrepancy?"       "No, it was yuck at high speed, and still yuck when I slowed down.  Initial impressions... The book reads like an overly wordy 'Dictatorship for Dummies'.  I found it kind of uneven.  The case studies of unpleasant stuff that worked, at least for a while, were interesting.  Sounded like he was arguing against people who said it wouldn't.  
His advice about things you shouldn't do was a lot more convincing than most of his advice about what you should do instead--there were some pretty dubious generalizations.  He seemed a lot better at specific topics where he had evidence--I liked that his advice about building fortresses was basically 'It depends'.  Some of his generalizations were pretty good, though; sociological rules that seem obvious."       "They weren't obvious when he wrote about them."       "Ah.  There was one part where I think he was trolling, but I don't know who."       "What was that?"       "Well, there was this long sentence with eleven conditionals over whether you should use this one guy as an example.  Sounded like Machiavelli was trolling people who didn't like the guy."       "He was known to troll.  But I believe the passage in question was driving home the point that you should not ignore evidence, even if it was generated by someone you don't like.  He was quite aware his views would annoy people.  It's also worth noting that The Prince was a work in progress, and not published during his lifetime.  It's not clear how close the versions we have are to what he intended to be final.  He died rather suddenly."  Doc took a sip of coffee and studied her.  "How coordinated was your normal speed reaction with your high speed one?"       "Pretty well.  As much as with anything else.  So... What does that tell you?  What were you testing?"       "Whether there were any warning signs of ethical inconsistency between the two parts of your mind.  Reading The Prince for the first time was an opportunity to test for that."       Flicker frowned.  "But it's not about ethics, it's about how to succeed as an autocrat.  It's notably ethics-free.  I thought that was the point?"       "Well, your consistency is a good sign."  Doc paused, with the look he got when he was estimating how long a lecture she'd be willing to sit still for.       "Go ahead," she said.  "You think this is important, and I already expected it to be unpleasant.  And we have food.  I'm not going anywhere unless a crisis alarm hits."       Doc snorted.  "All right.  Keep in mind that this is my summary, based on my own experience and judgement, and I'm leaving out a lot."       "Granted.  Go."       "The Prince was written as a how-to book, and rather bluntly pointed out problems with trying to base political actions on any then-current dogma or ethical systems.  That was implicitly a new ethical position, and very controversial.  Before Machiavelli, there were two main families of ethics in Western philosophy--deontology and virtue ethics.  After him, there were three, although there was quite a bit of thrashing around while various versions of the new one got elaborated and argued about.  You didn't notice the ethics in The Prince because it's a consequentialist book, you were educated as a consequentialist, and you're more likely to spot things you disagree with when skimming."       Flicker sighed.  "I understand consequentialism, but those others only seem to get talked about in connection with philosophy or religion, and the signal to noise ratio of that stuff is so low."       "That is a problem, yes," said Doc.  "Wild oversimplification incoming.  Something bad happens because someone screwed up.  What's their excuse?"       He waved a hand.  "If they say 'I meant well', they're appealing to virtue ethics.  If they say 'I followed the rules', that's deontology.  And if they say 'My plan failed', that's consequentialism."       "Ahh.  
Now that's useful.  I wish...  Hang on."  She frowned again, then sped up to think and check some references in the Database.  It was a long subjective time before she slowed down again.       "I think I have an ethical problem.  Several problems.  I'm not sure if they're ethical, but they are problems.  Is the Volunteer a deontologist?"       "You could ask him."       "Doc."       "It's not a trivial question to answer.  His views have evolved.  He was raised as one, just like he was raised as a human.  He does have certain fixed principles, but he considers taking consequences into account as a moral imperative."  Doc smiled.  "Complicates the categories a bit.  And he doesn't expect people to try to emulate everything he does.  If nothing else, because most people aren't bulletproof and can't fly."       "Here's the problem," said Flicker.  "I learned ethics from you, and you're a consequentialist.  But I've been using him as a moral example.  I assumed that was okay because you get along.  Now, maybe not so much.  Is it?"       "I thought it was likely to require adjustments on your part as you grew up.  I also thought you'd talk to one of us if it became a problem.  You weren't particularly receptive to theoretical ethics discussion, so I focused on more immediate concerns.  Like discouraging any tendencies you might have towards impulsive mass destruction."       "Yeah."  Flicker looked down.  "Can deontology, like... sneak up on you?  Can rules that you started using for practical reasons or because you thought they were good fallbacks start acquiring moral baggage without you realizing it?"       "Oh yes.  Happens all the time.  Rules are easy to teach, easy to learn, and easy to reinforce.  And most people learn some form of rule-based morality as children, so it's familiar.  That's one reason that the Volunteer is so careful about what he says.  A lot of people look up to him, but others have been trying to twist the meaning of what he does to fit their own agenda since...  Well, it started to get bad in the fifties, and has never really gotten better."       "You didn't teach me that way.  Your rules always came with explanations.  Or even stories.  And I could use the Database scenario analyzer to explore just why something would be bad."       Doc nodded.  "You were an extreme case.  I didn't see any alternative to teaching you consequentialism as soon as we could reliably communicate, because of the amount of power you have.  Database access and your ability to speed up to think helped make it more practical.  And I think, based on your negative reaction to external rules without what you view as adequate justification, that somewhere in your missing memories prior to age nine, someone, or several someones, tried to impose a rule system on you."       "Yeah..."  Flicker watched the server bot refill her pop and add an ice cube as she thought.  "But when I come up with the rule myself, like by watching the Volunteer, and it matches my consequence evaluation enough, I start just accepting it.  And if it involves an emotional reaction when I'm at human speed, but consequences that are hard to evaluate fully without speeding up, it seems to sometimes pick up a moral value; I feel guilty if I try to ignore it.  I think that's why the feeling that I'm not doing my duty if I don't help out somewhere every day, even if I'm wiped, is so hard to shake."       "That's plausible.  
It may be a consequence of whatever mechanism synchronizes moral judgement between the two parts of your mind; that's why I wanted to check on it."       Flicker looked back at him.  "Is this book likely to make my problems worse?"       "I don't think so."  Doc scratched his chin.  "But I do think your history and judgement synchronization are likely to be partly responsible for why you find reading about or discussing politics or ethics so unpleasant."       "Great.  I really wish I could talk to a human psychologist without putting a huge target on their back."       "Unlikely after what happened during the Lost Years.  And for any remotely safe intervention, they would need either training in a specialty that doesn't exist or years of experience with you."       "I'd settle for a smart generalist with personal experience in human emulation.  But that rules out most humans.  And I can't do a real Database search for one because of the privacy restrictions."       "The Database is frightening enough even with those restrictions.  But..."  Doc frowned and tapped at his handcomp.  "There, I set up a Database thread to analyze what a minimally intrusive search for someone appropriate for you to talk to might look like.  You'd need to think carefully about what you're looking for, and you wouldn't be able to access any of the raw data or decision process, even by override.  It still might not be practical.  You can discuss it with the Database integrity AI."       "Okay.  I'll talk to DASI about it."  Doc strongly disapproved of anthropomorphization of the Database.  But he was willing to let Flicker talk to DASI--the Database Algorithm Security and Integrity AI--for mental health reasons, so she was careful to maintain the distinction.       Doc started on another slice of pizza, and Flicker leaned back in her chair.       "In the meantime," she said, "I'll do a proper read of The Prince, and context on Machiavelli, and see if I can figure out why a consequentialist would write a handbook for autocrats."       Another crooked smile from Doc.  "He liked Florence a lot."       *****       Now.       Flicker had found the book and background material sometimes jarring and often depressing.  But it wasn't boring, and a few bits were full of insight.  Like: "...how one lives is so far distant from how one ought to live, that he who neglects what is done for what ought to be done, sooner effects his ruin than his preservation; for a man who wishes to act entirely up to his professions of virtue soon meets with what destroys him among so much that is evil."       She did not understand how anyone could interpret that as 'The end justifies the means'.  But a lot of people apparently had.  Flicker saw it as making clear that neither the world, nor people, nor you, were perfect, and if you didn't pay attention to that you were going to get in trouble--but it didn't justify anything.  You needed to get that somewhere else, if at all.       She wondered if any of the superheroes who died during the Lost Years had thought that acknowledging imperfection was equivalent to abandoning their principles.       She jumped back to a passage she'd read over and over.  Superheroes weren't political leaders, but there was definitely something there.  Still, it seemed a little... off.       "Upon this a question arises: whether it be better to be loved than feared or feared than loved? 
It may be answered that one should wish to be both, but, because it is difficult to unite them in one person, it is much safer to be feared than loved, when, of the two, either must be dispensed with."       Flicker wasn't sure about that.  The Volunteer was loved a lot and feared a little.  Virago and Nighthaunt were both feared a lot and loved a little.  All were successful, in different ways.       But safety didn't mean the same thing to her as it did to Machiavelli, and she wasn't sure how much that mattered.  She was still thinking about that when a special alarm flashed.       Flicker was off-duty, and still at Yellow endurance status--she hadn't fully recovered from her most recent shift.  But she had a short list of personal priority interrupts for the Database.  At the top was 'reckless speedster'.  And a crisis analysis appeared on her visor, right after the alarm:       Event:  Shockwave generation.  Building, infrastructure, and vehicle damage.  Probable injuries, potentially life-threatening.       Location:  Rome, Italy.       Tags:  Severe, ongoing, casualties, assessment lag.       Cause (extrapolated):  83%:  Hermes.  14%:  Unknown speedster.  3%:  Other.       Flicker was already in costume, and had sped up her mind.  Now she sped up her body and virtual typed a response.       Database, command, assign event cause neutralization to me.  Alert Doc, Box crisis intake, local emergency response, Interpol.  Start continuous updates by available channels, breadth over depth, cascading priority interrupts.       Acknowledged.       She sent further queries, ones that would take longer for the Database to answer, as she headed for the equipment bay.  Duty called.
Next: Part 2
11 notes